6 research outputs found

    Recognition of emotions using Kinects

    Abstract Emotion recognition can improve the quality of patient care, product development and human-machine interaction. Psychological studies indicate that emotional state is expressed in the way people walk, so human gait can be used to reveal a person's emotional state. This paper proposes a novel method for emotion recognition that uses Microsoft Kinect to record gait patterns and trains machine learning algorithms on those recordings. Fifty-nine subjects were recruited, and their gait patterns were recorded by two Kinect cameras. Joint selection, coordinate system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation were used for data preprocessing. We apply the Fourier transform to extract features from the gait patterns and use Principal Component Analysis (PCA) for feature selection. Using the NaiveBayes, RandomForests, LibSVM and SMO classifiers, the accuracy of recognition between neutral and angry emotions reaches 80%, and the accuracy of recognition between neutral and happy emotions reaches above 70%. These results indicate that the Kinect can be used for emotion recognition with fairly good performance.
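    The preprocessing chain named in the abstract (Gaussian filtering, differential operation, data segmentation) can be sketched roughly as below. The array shapes, window length, step and sigma are illustrative assumptions, not the authors' actual parameters.

```python
# Hedged sketch of the preprocessing the abstract describes: sliding-window
# Gaussian smoothing, differencing, and fixed-length segmentation of Kinect
# joint trajectories. Parameter values are assumptions for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def preprocess_gait(joints_xyz, sigma=2.0, segment_len=128, step=64):
    """joints_xyz: array of shape (n_frames, n_joints, 3) from one Kinect recording."""
    n_frames, n_joints, _ = joints_xyz.shape
    flat = joints_xyz.reshape(n_frames, n_joints * 3)

    # Sliding-window Gaussian filtering along the time axis.
    smoothed = gaussian_filter1d(flat, sigma=sigma, axis=0)

    # Differential operation: frame-to-frame differences emphasize motion.
    diffed = np.diff(smoothed, axis=0)

    # Segment the sequence into overlapping fixed-length windows.
    segments = [diffed[s:s + segment_len]
                for s in range(0, diffed.shape[0] - segment_len + 1, step)]
    return np.stack(segments) if segments else np.empty((0, segment_len, n_joints * 3))
```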

    Identifying Emotions from Non-Contact Gaits Information Based on Microsoft Kinects

    No full text
    Automatic emotion recognition from gait information, which has been investigated widely in human-machine interaction, psychology, psychiatry, behavioral science and related fields, is discussed in this paper. The gait information is non-contact, collected from Microsoft Kinects, and contains the 3-dimensional coordinates of 25 joints per person. These joint coordinates vary over time, so time-frequency features related to the neutral, happy and angry emotions are extracted with the discrete Fourier transform and statistical methods and used to build a classification model that identifies the three emotions. Experimental results show that this model works very well and that time-frequency features are effective in characterizing and recognizing emotions from this non-contact gait data. In particular, with an optimization algorithm, the recognition accuracy can be further improved by about 13.7 percent on average.
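    A minimal sketch of the kind of time-frequency features the abstract mentions: a discrete Fourier transform per joint coordinate followed by simple frequency-domain statistics. The particular statistics (dominant frequency, spectral energy, low-band mean) and the frequency band are assumptions, not the authors' exact feature set.

```python
# Minimal sketch: DFT-based time-frequency statistics per joint coordinate.
# The choice of statistics and the 3 Hz low band are illustrative assumptions.
import numpy as np

def time_frequency_features(segment, fps=30.0, low_band_hz=3.0):
    """segment: array of shape (n_frames, n_channels), one joint coordinate per channel."""
    n_frames, _ = segment.shape
    spectrum = np.abs(np.fft.rfft(segment, axis=0))          # magnitude spectrum
    freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)           # frequency of each bin

    dominant = freqs[np.argmax(spectrum[1:], axis=0) + 1]    # dominant frequency, skipping DC
    energy = np.sum(spectrum ** 2, axis=0)                   # total spectral energy
    low_band = spectrum[freqs <= low_band_hz].mean(axis=0)   # mean magnitude in the low band

    return np.concatenate([dominant, energy, low_band])      # one feature vector per segment
```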

    Detecting depression from Internet behaviors by time-frequency features

    No full text
    Early detection of depression is important for improving human well-being. This paper proposes a new method to detect depression through time-frequency analysis of Internet behaviors. We recruited 728 postgraduate students and obtained their scores on a depression questionnaire (Zung Self-rating Depression Scale, SDS) together with digital records of their Internet behaviors. Using time-frequency analysis, classification models are built to differentiate the higher-SDS group from the lower-SDS group, and prediction models are built to identify the mental status of the depressed group more precisely. Experimental results show that the classification and prediction models work well and that time-frequency features are effective in capturing changes in mental health status. The results of this paper are useful for improving the performance of public mental health services.
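    The classification step (separating the higher-SDS group from the lower-SDS group on time-frequency features of Internet behavior) could look roughly like the sketch below. The feature matrix, the SDS threshold of 50, and the choice of classifier are illustrative assumptions rather than the study's exact configuration.

```python
# Hedged sketch: binary classification of higher- vs lower-SDS groups from
# time-frequency features of Internet behavior logs. Threshold and classifier
# are assumptions for illustration.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def evaluate_sds_classifier(features, sds_scores, threshold=50):
    """features: (n_students, n_features) time-frequency features; sds_scores: (n_students,)."""
    labels = (np.asarray(sds_scores) >= threshold).astype(int)   # 1 = higher-SDS group
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(model, features, labels, cv=5, scoring="accuracy")
    return scores.mean(), scores.std()
```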

    Predicting Depression from Internet Behaviors by Time-frequency Features

    No full text
    Early detection of depression is important for improving human well-being. This paper proposes a new method to detect depression through time-frequency analysis of Internet behaviors. We recruited 728 postgraduate students and obtained their scores on a depression questionnaire (Zung Self-rating Depression Scale, SDS) together with digital records of their Internet behaviors. Using time-frequency analysis, we built classification models for differentiating the higher-SDS group from the lower-SDS group and prediction models for identifying the mental status of the depressed group more precisely. Experimental results show that the classification and prediction models work well and that time-frequency features are effective in capturing changes in mental health status. The results of this paper might be useful for improving the performance of public mental health services.
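    For the prediction models that the abstract says identify the mental status of the depressed group more precisely, one plausible reading is a regression of SDS scores within that group. The sketch below assumes that reading; the regressor, the threshold of 50, and the evaluation metric are all assumptions for illustration.

```python
# Hedged sketch: regressing SDS scores within the higher-SDS (depressed) group,
# one plausible reading of the "prediction models" in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import cross_val_predict

def predict_sds_in_depressed_group(features, sds_scores, threshold=50):
    """Cross-validated SDS-score prediction restricted to the higher-SDS group."""
    mask = np.asarray(sds_scores) >= threshold
    X, y = np.asarray(features)[mask], np.asarray(sds_scores)[mask]
    preds = cross_val_predict(RandomForestRegressor(n_estimators=200), X, y, cv=5)
    return mean_absolute_error(y, preds)
```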

    Emotion recognition using Kinect motion capture data of human gaits

    No full text
    Automatic emotion recognition is of great value in many applications; however, to fully realize that value, more portable, non-intrusive and inexpensive technologies need to be developed. Human gait can reflect the walker’s emotional state and can serve as an information source for emotion recognition. This paper proposes a novel method to recognize emotional state through human gait using Microsoft Kinect, a low-cost, portable, camera-based sensor. Fifty-nine participants’ gaits under a neutral state, induced anger and induced happiness were recorded by two Kinect cameras, and the original data were processed through joint selection, coordinate system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Features of the gait patterns were extracted from the 3-dimensional coordinates of 14 main body joints by Fourier transformation and Principal Component Analysis (PCA). The classifiers NaiveBayes, RandomForests, LibSVM and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies of recognizing anger and happiness from the neutral state reached 80.5% and 75.4%, respectively. Although the results of distinguishing the angry and happy states were not ideal in the current study, the work shows the feasibility of automatically recognizing emotional states from gait, with characteristics that meet the application requirements.
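    The final training-and-evaluation step can be sketched as a PCA stage followed by the classifiers the abstract names. scikit-learn's GaussianNB, RandomForestClassifier and SVC are used here as stand-ins for the NaiveBayes, RandomForests, LibSVM and SMO implementations; the number of PCA components and the cross-validation scheme are assumptions.

```python
# Hedged sketch: comparing the classifiers named in the abstract on PCA-reduced
# gait features. The scikit-learn estimators below are stand-ins for the named
# classifiers; PCA dimensionality and CV folds are assumptions.
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

CLASSIFIERS = {
    "NaiveBayes": GaussianNB(),
    "RandomForests": RandomForestClassifier(n_estimators=100),
    "LibSVM": SVC(kernel="rbf"),
    "SMO": SVC(kernel="linear"),   # SMO is an SVM trainer; a linear SVC approximates it here
}

def compare_classifiers(features, labels, n_components=20):
    """features: (n_segments, n_features) gait features; labels: emotion label per segment."""
    results = {}
    for name, clf in CLASSIFIERS.items():
        model = make_pipeline(PCA(n_components=n_components), clf)
        results[name] = cross_val_score(model, features, labels, cv=5).mean()
    return results
```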